A pseudo-philosophical look at robot car justice

"TrackRatMk1" (trackratmk1)
09/14/2015 at 14:22 • Filed to: None


Finally, an opinion on autonomous cars we can all get behind! Let’s thank Forbes for the soapbox they so kindly lent this automotive expert economist to stand on.

Ethics Won’t Be a Big Problem for Driverless Cars

Sorry, but ethics is a huge problem for AVs. In fact, most would say it is the BIGGEST obstacle to AVs being adopted with any kind of critical mass. Me included. Let’s dig in.

The central tenet our hero moralist arrives at is that AVs don’t have to be the best drivers ever; they just have to suck less than your average human. How hard can that be? After all, a human did this.

[image]

The real meat and potatoes of this logical fallacy starts off when the author declares that good judgement is not really a necessity to operate motor vehicles.

“The idea that humans will act ethically and wisely while driving is an absurd and false assumption.”

A very good point! However, a critical distinction is whether the humans are making good or bad decisions behind the wheel. Good drivers generally don’t crash, and thereby avoid needlessly squashing fellow motorists and their cars. Bad drivers, when they do these things, go to jail. I believe “vehicular manslaughter” is the trendy vernacular.

Justice, in some way, is served. The topic of what constitutes fair punishment is not ripe for discussion in this article; all that matters is that BAD drivers are punished and, hopefully after rehabilitation, learn to become GOOD drivers. Good drivers get to keep their privilege of having a happy hour punch card. (That’s a driver’s license.)

You can’t serve justice to a machine. Ever tried kicking the crap out of your printer? It just works even worse than before. And don’t you think we are already giving the driving robots a little too much credit? For muck’s sake, it’s like all the Consumer Electronics people think the robots are going to be better drivers than Ayrton Senna. Except they’ll never speed. They’ll never take chances. They’ll talk politely, in binary of course, to all their robot friends and say “no, you sir, you were here first! By golly, I think you have the right of way, after you and good day!” They’ll never get their 1’s and 0’s mixed up, and they’ll never crash.

Just wait, I say. These aren’t just computers; they are a mix of mechanical and digital machines. Mechanical things need maintenance, and mechanical things break. And it doesn’t really matter if said breakage tosses a car safely into a sunny meadow on the side of the road, or sends it careening into a schoolbus full of underprivileged puppies. Autonomous accidents may happen only once for every hundred stupid-people accidents. But once is enough.

Once that happens, I can assure you, there will be calls for justice. Someone WILL be held responsible. Will it be the owner, who was FaceTiming in the backseat? Or the mechanic who didn’t notice a damaged control arm the last time it was in for service? Or will it be the manufacturer, who built this car and said “hey everyone, our programmers say it’s safe, so let’s start the queue for orders at the Wendy’s down the street from our new Google Auto franchise.”

Right now, responsibility behind the wheel is mostly cut and dried. You crash, you lose. I don’t think society is ready to absolve everyone from guilt, and hold blameless 100% of all actors, in the event of a robot killing a human. It certainly won’t feel like justice to go all Office Space on a Nissan Leaf when it runs over your Mom, even though statistically we are all safer because of it.

Ethics are certainly a big problem for driverless cars, and apparently they are for some economists too.


DISCUSSION (13)


CalzoneGolem > TrackRatMk1
09/14/2015 at 14:29

Needs more positronic brain.


spanfucker retire bitch > TrackRatMk1
09/14/2015 at 14:31

The central tenet our hero moralist concludes is that AV’s don’t have to be the best drivers ever, they just have to suck less than your average human.

He’s not wrong.

Where the debate comes in is to what degree do they need to be better and more consistent than your average driver?


TrackRatMk1 > spanfucker retire bitch
09/14/2015 at 14:36

Did you not get to the part where a robot eventually runs over your mom? Should be no big deal, because you know, we’re all safer now.


TrackRatMk1 > spanfucker retire bitch
09/14/2015 at 14:37

His point is they don’t have to be perfect. My point is they DO.


TrackRatMk1 > CalzoneGolem
09/14/2015 at 14:38

Nice.


spanfucker retire bitch > TrackRatMk1
09/14/2015 at 14:45

They will never be perfect. However, not even the best drivers in the world are perfect.

So no, they don’t have to be perfect. Just really, really, really good.


spanfucker retire bitch > TrackRatMk1
09/14/2015 at 14:46

As opposed to a drunk driver?


TrackRatMk1 > spanfucker retire bitch
09/14/2015 at 14:59

I really hope my point wasn’t that hard to follow. A drunk driver will see justice. A robot car? Not so much.


jariten1781 > TrackRatMk1
09/14/2015 at 15:32

Morals != Liability

I believe you’re arguing a parallel and important, but different, point than the author. His point is that there isn’t as big a moral conundrum (and he’s talking about the textbook definition of ‘morals’, not ‘ethics’) as some are positing; he essentially states that increased safety overall absolves the moral question in aggregate. Individual liability/responsibility for the corner cases is left unanswered in the column.

I’m not sure I agree with that, but the argument isn’t inherently broken. There’s also probably going to be a lot of cross-talk on this issue since the nuances between the definitions of ‘ethics’ and ‘morals’ are lost in common language discussion.


TrackRatMk1 > jariten1781
09/14/2015 at 15:46

Interesting take on it. Personally, I think the Forbes author argues quite well on an ethical level. Ethics is essentially the study of morals, which is something every individual has.

What I’ve done is taken his ethical evaluation (statistically safer cars = better for society) to an individual, and thus moral level in order to see how people will respond to these cars when it affects them personally.


jariten1781 > TrackRatMk1
09/14/2015 at 16:02

Oh yeah, you can have lots of fun discussing the difference, and like all philosophical questions there isn’t a right answer. I’ve always aligned with the thinkers who believe ‘ethics’ is aligning with an external code while ‘morals’ are inherent to humanity. Therefore someone can be ethical, but morally bankrupt, if they’re willing to perform duties to the letter as defined by an immoral organization (think WWII concentration camp guards).

So I see his argument as a moral one since the ethics of machines choosing who lives and dies in cars haven’t been codified yet.


TrackRatMk1 > jariten1781
09/14/2015 at 16:33

Fair play!


spanfucker retire bitch > TrackRatMk1
09/14/2015 at 16:34

Doesn’t really change the fact that my mother was hit, now does it?

Autonomous vehicles are going to be a massive legal and cultural shift. Judging them by current legal and cultural realities is, I think, going to be a hard thing to do.